Linearized Smooth Additive Classifiers
Abstract
We consider a framework for learning additive classifiers based on regularized empirical risk minimization, where the regularization favors "smooth" functions. We present representations of classifiers for which the optimization problem can be solved efficiently. The first family of such classifiers is derived from a penalized spline formulation due to Eilers and Marx, modified to enable linearization. The second is a novel family of classifiers based on classes of orthogonal basis functions with orthogonal derivatives. Both families lead to explicit feature embeddings that can be used with off-the-shelf linear solvers such as LIBLINEAR to obtain additive classifiers. The proposed classifiers offer better trade-offs between training time, memory overhead, and accuracy than the state of the art in additive classifier training.
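To make the pipeline concrete, the sketch below (an illustration only, not the paper's exact construction) expands each input dimension, assumed to be scaled to [0, 1], in a small cosine basis whose derivatives are mutually orthogonal sine functions, and hands the concatenated features to scikit-learn's LinearSVC, which is backed by LIBLINEAR. With a basis of this kind, the solver's ordinary L2 weight penalty acts as a smoothness penalty on each learned one-dimensional component, which is the effect described above. The embedding function, basis size, and toy data are illustrative choices, not details taken from the paper.

```python
# Illustrative sketch: explicit per-dimension basis expansion + linear solver
# yields an additive classifier. The cosine basis is a stand-in for the
# orthogonal bases with orthogonal derivatives discussed in the abstract.
import numpy as np
from sklearn.svm import LinearSVC  # LIBLINEAR-backed linear SVM


def cosine_embedding(X, n_basis=8):
    """Map each feature x (assumed scaled to [0, 1]) to
    [x, cos(pi*1*x), ..., cos(pi*n_basis*x)], concatenated over dimensions."""
    X = np.asarray(X, dtype=float)
    feats = [X]  # keep the linear term
    for k in range(1, n_basis + 1):
        feats.append(np.cos(np.pi * k * X))
    return np.hstack(feats)


# Toy usage: labels generated by a smooth additive ground-truth function.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 3))
y = (np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 - X[:, 2] > 0.2).astype(int)

clf = LinearSVC(C=1.0)           # off-the-shelf linear solver
clf.fit(cosine_embedding(X), y)  # additive model via the explicit embedding
print(clf.score(cosine_embedding(X), y))
```

Because the embedding is explicit, the learned weights for each dimension's basis functions define one smooth univariate function, and training cost is that of an ordinary linear model on the expanded features.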
Similar Resources
Linearized Additive Classifiers
We revisit the additive model learning literature and adapt a penalized spline formulation due to Eilers and Marx [4] to train additive classifiers efficiently. We also propose two new embeddings based on two classes of orthogonal bases with orthogonal derivatives, which can likewise be used to learn additive classifiers efficiently. This paper follows the popular theme in the current literature wher...
THESIS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY: On weak and strong convergence of numerical approximations of stochastic partial differential equations
This thesis is concerned with numerical approximation of linear stochastic partial differential equations driven by additive noise. In the first part, we develop a framework for the analysis of weak convergence and within this framework we analyze the stochastic heat equation, the stochastic wave equation, and the linearized stochastic Cahn-Hilliard, or the linearized Cahn-Hilliard-Cook equatio...
Burgers Equation with Affine Linear Noise: Dynamics and Stability
The main objective of this article is to analyse the dynamics of Burgers equation on the unit interval, driven by affine linear white noise. It is shown that the solution field of the stochastic Burgers equation generates a smooth perfect and locally compacting cocycle on the energy space L2([0, 1],R). Using multiplicative ergodic theory techniques, we establish the existence of a discrete non-...
Weak Convergence of Finite Element Approximations of Linear Stochastic Evolution Equations with Additive Noise II. Fully Discrete Schemes
We present an abstract framework for analyzing the weak error of fully discrete approximation schemes for linear evolution equations driven by additive Gaussian noise. First, an abstract representation formula is derived for sufficiently smooth test functions. The formula is then applied to the wave equation, where the spatial approximation is done via the standard continuous finite element met...
Application of ensemble learning techniques to model the atmospheric concentration of SO2
In view of pollution prediction modeling, the study adopts homogenous (random forest, bagging, and additive regression) and heterogeneous (voting) ensemble classifiers to predict the atmospheric concentration of Sulphur dioxide. For model validation, results were compared against widely known single base classifiers such as support vector machine, multilayer perceptron, linear regression and re...